Signature-Based Gr\"obner Basis Algorithms --- Extended MMM Algorithm for computing Gr\"obner bases
Signature-based algorithms are a popular class of algorithms for computing
Gr\"obner bases, and many related papers have been published recently. In this
paper, no new signature-based algorithms and no new proofs are presented.
Instead, a new view of signature-based algorithms is given: they can be
regarded as an extended version of the famous MMM algorithm. Through this
view, this paper aims to give an easier way to understand signature-based
Gr\"obner basis algorithms.
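As context for the discussion above, a Gr\"obner basis can be computed with an off-the-shelf tool; the snippet below is a minimal sketch using SymPy, whose `groebner` routine offers both a classical Buchberger backend and a signature-based one (`f5b`). The polynomial system is an arbitrary illustration, not an example from the paper.

```python
from sympy import groebner, symbols

x, y = symbols('x y')
system = [x**2 + y**2 - 1, x*y - 1]

# SymPy exposes two backends: classical Buchberger and signature-based f5b.
G_buch = groebner(system, x, y, order='lex', method='buchberger')
G_f5b = groebner(system, x, y, order='lex', method='f5b')

# Both return the unique reduced Groebner basis of the ideal,
# so the resulting bases coincide.
print(G_buch.exprs == G_f5b.exprs)  # → True
```

Since the reduced Gr\"obner basis for a fixed monomial order is unique, any correct algorithm, signature-based or not, must produce the same final basis; the algorithms differ only in how many useless reductions they avoid along the way.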
Continual Local Training for Better Initialization of Federated Models
Federated learning (FL) refers to the learning paradigm that trains machine
learning models directly in decentralized systems consisting of smart edge
devices, without transmitting the raw data, which avoids heavy communication
costs and privacy concerns. Given the typical heterogeneous data distributions
in such situations, the popular FL algorithm \emph{Federated Averaging}
(FedAvg) suffers from weight divergence and thus cannot achieve a competitive
performance for the global model (denoted as the \emph{initial performance} in
FL) compared to centralized methods. In this paper, we propose a continual
local training strategy to address this problem. Importance weights are
evaluated on a small proxy dataset on the central server and then used to
constrain the local training. With this additional term, we alleviate the
weight divergence and continually integrate the knowledge on different local
clients into the global model, which ensures a better generalization ability.
Experiments on various FL settings demonstrate that our method significantly
improves the initial performance of federated models with few extra
communication costs.
Comment: This paper has been accepted to the 2020 IEEE International Conference
on Image Processing (ICIP 2020).
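The idea described above, local updates constrained by importance weights estimated on a server-side proxy dataset, can be sketched in a few lines. The function names, the quadratic (EWC-style) form of the penalty, and all hyperparameters below are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def local_train(w_global, grad_fn, importance, lam=0.1, lr=0.05, steps=20):
    """One client's local training with a continual-learning penalty.

    The extra term lam * importance * (w - w_global) pulls parameters that
    the (hypothetical) proxy dataset marks as important back toward the
    global model, mitigating weight divergence across clients.
    """
    w = w_global.copy()
    for _ in range(steps):
        g = grad_fn(w) + lam * importance * (w - w_global)
        w -= lr * g
    return w

def fedavg(client_weights, client_sizes):
    """Standard FedAvg aggregation: size-weighted average of client models."""
    sizes = np.asarray(client_sizes, dtype=float)
    return np.average(client_weights, axis=0, weights=sizes)
```

With a larger penalty coefficient, each client's model stays closer to the global initialization, which is the mechanism the abstract credits for the improved initial performance.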
The F5 Algorithm in Buchberger's Style
The famous F5 algorithm for computing Gr\"obner bases was presented by
Faug\`ere in 2002. The original version of F5 is given as programming code, so
it is somewhat difficult to understand. In this paper, the F5 algorithm is
simplified as F5B in Buchberger's style, such that it is easy to understand
and implement. In order to describe F5B, we introduce F5-reduction, which
keeps the signature of a labeled polynomial unchanged after reduction. The
equivalence between F5 and F5B is also shown. Finally, some versions of the F5
algorithm are illustrated.
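The key property stated above, that F5-reduction leaves the signature of a labeled polynomial unchanged, can be sketched in a few lines. Everything here (the `LabeledPoly` container, the exponent-vector encoding of monomials, and the elided signature-comparison check) is an illustrative assumption, not the paper's definitions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LabeledPoly:
    """A labeled polynomial: a signature paired with a polynomial.

    signature: (module index, monomial exponent vector)
    poly: mapping from monomial exponent vector to coefficient
    """
    signature: tuple
    poly: dict

def f5_reduce_step(f, g, mono, coeff):
    """Subtract coeff * mono * g.poly from f.poly, keeping f's signature.

    In F5 terms this step is only legal when the signature of mono * g is
    smaller than the signature of f; that check is elided for brevity.
    """
    new_poly = dict(f.poly)
    for m, c in g.poly.items():
        shifted = tuple(a + b for a, b in zip(mono, m))
        new_poly[shifted] = new_poly.get(shifted, 0) - coeff * c
        if new_poly[shifted] == 0:
            del new_poly[shifted]
    # The signature is carried over untouched: this preservation is the
    # defining property of F5-reduction described in the abstract.
    return LabeledPoly(f.signature, new_poly)
```

For example, reducing f = x^2 + xy by x * (x + y) cancels f entirely, yet the result still carries f's original signature, which is what lets F5 detect and skip redundant computations.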